On self-normalizing cyclic subgroups

Authors

Abstract


Similar articles

p-Rational characters and self-normalizing Sylow p-subgroups

Let G be a finite group, p a prime, and P a Sylow p-subgroup of G. Several recent refinements of the McKay conjecture suggest that there should exist a bijection between the irreducible characters of p′-degree of G and the irreducible characters of p′-degree of N_G(P), which preserves the fields of values of corresponding characters (over the p-adics). This strengthening of the McKay conjecture has ...

Full text
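
For orientation, the McKay conjecture that this refinement strengthens is the following counting statement (a standard formulation supplied here, not quoted from the excerpt); the refinement asks for a bijection realizing the equality which also preserves fields of values over the p-adic numbers:

\[
  |\mathrm{Irr}_{p'}(G)| \;=\; |\mathrm{Irr}_{p'}(N_G(P))|,
\]

where \mathrm{Irr}_{p'}(G) denotes the set of irreducible complex characters of G of degree prime to p, P is a Sylow p-subgroup of G, and N_G(P) is its normalizer.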

POS-groups with some cyclic Sylow subgroups

A finite group G is said to be a POS-group if for each x in G the cardinality of the set {y in G | o(y) = o(x)} is a divisor of the order of G. In this paper we study the structure of POS-groups with some cyclic Sylow subgroups.

Full text
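
As a quick illustration of the POS condition defined above, the following Python sketch (my own example, not taken from the paper) counts element orders in the symmetric group S_3 and checks that each count divides the group order:

from itertools import permutations
from math import lcm

def order_of_permutation(p):
    # Order of a permutation in one-line notation: lcm of its cycle lengths.
    seen, lengths = set(), []
    for start in range(len(p)):
        if start in seen:
            continue
        length, x = 0, start
        while x not in seen:
            seen.add(x)
            x = p[x]
            length += 1
        lengths.append(length)
    return lcm(*lengths)

def is_pos_group(elements, order):
    # POS condition: for each x in G, #{y in G : o(y) = o(x)} divides |G|.
    counts = {}
    for g in elements:
        counts[order(g)] = counts.get(order(g), 0) + 1
    return all(len(elements) % c == 0 for c in counts.values())

s3 = list(permutations(range(3)))              # the 6 elements of S_3
print(is_pos_group(s3, order_of_permutation))  # True: orders 1, 2, 3 occur 1, 3, 2 times

Not every finite group passes this test; for instance, S_4 has 9 elements of order 2, and 9 does not divide 24.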

Subgroups and cyclic groups

Example 1.2. (i) For every group G, G ≤ G. If H ≤ G and H ≠ G, we call H a proper subgroup of G. Similarly, for every group G, {1} ≤ G. We call {1} the trivial subgroup of G. Most of the time, we are interested in proper, nontrivial subgroups of a group. (ii) Z ≤ Q ≤ R ≤ C; here the operation is necessarily addition. Similarly, Q∗ ≤ R∗ ≤ C∗, where the operation is multiplication. Likewise, μ_n ...

Full text
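
Since the excerpt above works with the subgroup relation ≤, here is a minimal sketch (my own finite example, not from the notes) of the criterion that a nonempty subset of a finite group is a subgroup as soon as it is closed under the operation:

def is_subgroup(H, G, op):
    # For a finite group G, a nonempty subset H is a subgroup iff it is
    # closed under the operation (inverses then come for free).
    H, G = set(H), set(G)
    return bool(H) and H <= G and all(op(a, b) in H for a in H for b in H)

def add_mod6(a, b): return (a + b) % 6

Z6 = range(6)                                  # Z/6 under addition mod 6
print(is_subgroup({0, 2, 4}, Z6, add_mod6))    # True: a proper, nontrivial subgroup
print(is_subgroup({0, 2}, Z6, add_mod6))       # False: 2 + 2 = 4 is not in the subset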

Subgroups of Cyclic Groups

In a group G, we denote the (cyclic) group of powers of some g ∈ G by 〈g〉 = {g^k : k ∈ Z}. If G = 〈g〉, then G itself is cyclic, with g as a generator. Examples of infinite cyclic groups include Z, with (additive) generator 1, and the group 2^Z of integral powers of the real number 2, with generator 2. The most basic examples of finite cyclic groups are Z/(m) with (additive) generator 1 and μ_m = {z...

Full text
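
The definition 〈g〉 = {g^k : k ∈ Z} above can be traced concretely in Z/m, written additively; this short Python sketch (my own illustration, not from the excerpt) lists 〈g〉 and checks the standard order formula |〈g〉| = m / gcd(g, m):

from math import gcd

def cyclic_subgroup(g, m):
    # Repeatedly add the generator modulo m until we return to 0.
    H, x = set(), 0
    while True:
        H.add(x)
        x = (x + g) % m
        if x == 0:
            return H

H = cyclic_subgroup(8, 12)
print(sorted(H))                    # [0, 4, 8]
print(len(H) == 12 // gcd(8, 12))   # True: |<g>| = m / gcd(g, m)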

Self-Normalizing Neural Networks

Deep Learning has revolutionized vision via convolutional neural networks (CNNs) and natural language processing via recurrent neural networks (RNNs). However, success stories of Deep Learning with standard feed-forward neural networks (FNNs) are rare. FNNs that perform well are typically shallow and therefore cannot exploit many levels of abstract representations. We introduce self-normalizing ...

Full text
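
The excerpt is cut off before the technical content; to the best of my knowledge the paper's central ingredient is the SELU activation, so the sketch below should be read as background recalled from that paper rather than as something stated on this page:

import math

ALPHA = 1.6732632423543772    # constants chosen so that zero-mean, unit-variance
LAMBDA = 1.0507009873554805   # activations are approximately preserved layer to layer

def selu(x: float) -> float:
    # Scaled exponential linear unit applied to one pre-activation value.
    return LAMBDA * (x if x > 0 else ALPHA * (math.exp(x) - 1.0))

print(round(selu(1.0), 4), round(selu(-1.0), 4))   # 1.0507 -1.1113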


Journal

Journal title: Journal of Algebra

Year: 1989

ISSN: 0021-8693

DOI: 10.1016/0021-8693(89)90250-0